
Search in the Catalogues and Directories

Hits 1 – 20 of 102

1
Universals of Linguistic Idiosyncrasy in Multilingual Computational Linguistics: Dagstuhl Seminar 21351
In: Universals of Linguistic Idiosyncrasy in Multilingual Computational Linguistics, Aug 2021, pp. 89–138, ISSN 2192-5283. ⟨10.4230/DagRep.11.7.89⟩ ; https://hal.archives-ouvertes.fr/hal-03507948 ; https://gitlab.com/unlid/dagstuhl-seminar/-/wikis/home (2021)
BASE
2
Universals of Linguistic Idiosyncrasy in Multilingual Computational Linguistics (Dagstuhl Seminar 21351)
Croft, William; Savary, Agata; Baldwin, Timothy. - : Dagstuhl Reports. DagRep, Volume 11, Issue 7, 2021
BASE
3
Universal Dependencies 2.9
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
BASE
4
Universal Dependencies 2.8.1
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
BASE
5
Universal Dependencies 2.8
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2021
BASE
6
Universal Dependencies ...
BASE
7
Syntactic Nuclei in Dependency Parsing -- A Multilingual Exploration ...
Basirat, Ali; Nivre, Joakim. - : arXiv, 2021
BASE
8
Revisiting Negation in Neural Machine Translation ...
BASE
9
Universals of Linguistic Idiosyncrasy in Multilingual Computational Linguistics (Dagstuhl Seminar 21351) ...
Baldwin, Timothy; Croft, William; Nivre, Joakim. - : Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2021
BASE
10
Attention Can Reflect Syntactic Structure (If You Let It) ...
BASE
11
What Should/Do/Can LSTMs Learn When Parsing Auxiliary Verb Constructions? ...
de Lhoneux, Miryam; Nivre, Joakim. - : Underline Science Inc., 2021 (NAACL 2021)
BASE
12
Schrödinger's Tree -- On Syntax and Neural Language Models ...
Kulmizev, Artur; Nivre, Joakim. - : arXiv, 2021
BASE
13
I’ve got a construction looks funny – representing and recovering non-standard constructions in UD
Ruppenhofer, Josef [author]; Rehbein, Ines [author]; de Marneffe, Marie-Catherine [editor]. - Mannheim : Leibniz-Institut für Deutsche Sprache (IDS), Bibliothek, 2020
DNB Subject Category Language
14
Universal Dependencies 2.7
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2020
BASE
15
Universal Dependencies 2.6
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. - : Universal Dependencies Consortium, 2020
BASE
16
Køpsala: Transition-Based Graph Parsing via Efficient Training and Effective Encoding ...
BASE
17
Understanding Pure Character-Based Neural Machine Translation: The Case of Translating Finnish into English ...
BASE
18
Understanding Pure Character-Based Neural Machine Translation: The Case of Translating Finnish into English ...
Tang, Gongbo; Sennrich, Rico; Nivre, Joakim. - : International Committee on Computational Linguistics, 2020
BASE
19
Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection ...
BASE
20
Do Neural Language Models Show Preferences for Syntactic Formalisms? ...
Abstract: Recent work on the interpretability of deep neural language models has concluded that many properties of natural language syntax are encoded in their representational spaces. However, such studies often suffer from limited scope by focusing on a single language and a single linguistic formalism. In this study, we aim to investigate the extent to which the semblance of syntactic structure captured by language models adheres to a surface-syntactic or deep syntactic style of analysis, and whether the patterns are consistent across different languages. We apply a probe for extracting directed dependency trees to BERT and ELMo models trained on 13 different languages, probing for two different syntactic annotation styles: Universal Dependencies (UD), prioritizing deep syntactic relations, and Surface-Syntactic Universal Dependencies (SUD), focusing on surface structure. We find that both models exhibit a preference for UD over SUD - with interesting variations across languages and layers - and that the strength ... : ACL 2020 ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
URL: https://dx.doi.org/10.48550/arxiv.2004.14096
https://arxiv.org/abs/2004.14096
BASE
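The probing setup described in the abstract above can be illustrated with a minimal sketch: a structural probe in the spirit of Hewitt and Manning's distance probe learns a linear projection of contextual embeddings so that squared distances between projected tokens approximate tree distances, after which a tree over the tokens is recovered with a spanning-tree algorithm. The snippet below (Python) is an illustration of that idea only, not the code used in the paper; the random vectors stand in for BERT/ELMo embeddings, the projection B is a hypothetical, untrained placeholder, and the recovered tree is undirected rather than the directed dependency trees probed for in the paper.

# Minimal sketch of a distance-based structural probe (illustration only).
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)
n_tokens, dim, rank = 6, 768, 64

H = rng.normal(size=(n_tokens, dim))   # stand-in contextual embeddings (e.g. one sentence)
B = rng.normal(size=(dim, rank))       # probe projection; normally trained against gold trees

T = H @ B                              # project tokens into the probe space
diff = T[:, None, :] - T[None, :, :]
dist = (diff ** 2).sum(axis=-1)        # predicted squared tree distances between token pairs

# Recover an (undirected) tree over the tokens from the predicted distances.
tree = minimum_spanning_tree(dist).toarray()
edges = list(zip(*np.nonzero(tree)))
print("recovered tree edges:", edges)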
